# Russian-English Bilingual Support
## Xlm Roberta Base Intent Twin
- Author: forthisdream
- License: MIT
- Task: Text Classification
- Framework: Transformers (multilingual)
- Description: XLM-RoBERTa-base is a multilingual pre-trained model built on the RoBERTa architecture; it supports Russian and English and is suited to text classification tasks (see the usage sketch below).
- Downloads / Likes: 30 / 1
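As a rough illustration of how a Transformers classification checkpoint like this can be used, the sketch below runs it through the `text-classification` pipeline. The repository ID is an assumption pieced together from the listing (author "forthisdream", model "Xlm Roberta Base Intent Twin"), not a confirmed identifier.

```python
# Minimal sketch: Russian/English text classification via the Transformers
# pipeline API. The repository ID below is an assumption based on the listing.
from transformers import pipeline

MODEL_ID = "forthisdream/xlm_roberta_base_intent_twin"  # assumed repo ID

classifier = pipeline("text-classification", model=MODEL_ID)

# The model is multilingual, so Russian and English inputs can be mixed.
examples = [
    "Подскажите, как оформить возврат заказа?",   # Russian
    "I want to cancel my subscription.",          # English
]

for text in examples:
    result = classifier(text)[0]
    print(f"{text!r} -> {result['label']} ({result['score']:.3f})")
```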
## Sci Rus Tiny
- Author: mlsa-iai-msu-lab
- License: MIT
- Task: Text Embedding
- Framework: Transformers (multilingual)
- Description: SciRus-tiny is a compact model for producing embeddings of Russian and English scientific texts, trained on eLibrary data with contrastive learning (see the embedding sketch below).
- Downloads / Likes: 369 / 12
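The sketch below shows one common way to obtain sentence embeddings from a compact encoder with Transformers. Both the repository ID and the use of mean pooling over token states are assumptions inferred from the listing; the model card may recommend a different pooling strategy.

```python
# Minimal sketch: embedding Russian/English scientific text with a compact
# encoder. The repo ID and mean pooling are assumptions, not confirmed usage.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "mlsa-iai-msu-lab/sci-rus-tiny"  # assumed repo ID from the listing

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

def embed(texts):
    """Return L2-normalized mean-pooled embeddings for a list of texts."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state          # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
    pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # mean over tokens
    return torch.nn.functional.normalize(pooled, dim=-1)

ru = "Методы машинного обучения в анализе научных публикаций"
en = "Machine learning methods for the analysis of scientific publications"
emb = embed([ru, en])
print("cosine similarity:", float(emb[0] @ emb[1]))
```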
## Xlm Roberta Large Qa Multilingual Finedtuned Ru
- Author: AlexKay
- License: Apache-2.0
- Task: Question Answering
- Framework: Transformers (multilingual)
- Description: A pretrained model based on the XLM-RoBERTa-large architecture, trained with the masked language modeling objective and fine-tuned on English and Russian question answering datasets (see the usage sketch below).
- Downloads / Likes: 1,814 / 48
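For extractive question answering, a checkpoint like this is typically used through the `question-answering` pipeline, as sketched below. The repository ID is an assumption pieced together from the listing (author "AlexKay"), not a confirmed identifier.

```python
# Minimal sketch: extractive QA over a Russian passage with the Transformers
# `question-answering` pipeline. The repo ID below is an assumption.
from transformers import pipeline

MODEL_ID = "AlexKay/xlm-roberta-large-qa-multilingual-finedtuned-ru"  # assumed

qa = pipeline("question-answering", model=MODEL_ID)

context = (
    "XLM-RoBERTa — это многоязычная модель, обученная на текстах "
    "на ста языках, включая русский и английский."
)
result = qa(question="На скольких языках обучена модель?", context=context)
print(result["answer"], f"(score={result['score']:.3f})")
```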